The early history of current Algebra

Authors

  • Herbert Pietschmann
Abstract

The history of Current Algebra is reviewed up to the appearance of the Adler-Weisberger sum rule. Particular emphasis is given to the role of current algebra in the historical struggle in strong-interaction physics between field theory and the S-matrix approach based on dispersion relations. The question of whether some particles are truly fundamental, or whether all hadrons are bound or resonant states of one another, played an important role in this struggle and is therefore also considered.

1 S-Matrix versus Quantum Field Theory in the 1960s

Shortly after the end of World War II, particle physics received its relativistic and mathematically sound basis from the new renormalization theory; among many contributors, Feynman, Schwinger and Tomonaga were distinguished by the Nobel Prize of 1965 "for their fundamental work in quantum electrodynamics, with deep-ploughing consequences for the physics of elementary particles". For the first time it was possible to calculate higher-order corrections ("loop diagrams"); the most famous examples were the anomalous magnetic moments of the electron and the muon, and the Lamb shift. However, this great victory of Quantum Electrodynamics was overshadowed by the fact that the theory did not work as well for strong interactions and not at all for weak interactions (except at lowest order, where it worked astonishingly well). Thus it became necessary to look for a different approach. In 1956, at the 6th annual Rochester conference, which took place in New York, M. Gell-Mann suggested using dispersion relations to calculate observable quantities. Two years later, Mandelstam published a historic paper on double dispersion relations [Mandelstam 1958]; they soon became known as the "Mandelstam representation".
In the introduction, Mandelstam wrote: "In recent years dispersion relations have been used to an increasing extent in pion physics for phenomenological and semi-phenomenological analyses of experimental data, and even for the calculation of certain quantities in terms of the pion-nucleon scattering amplitude. It is therefore tempting to ask the question whether or not the dispersion relations can actually replace the more usual equations of field theory and be used to calculate all observable quantities in terms of a finite number of coupling constants".

(a) e-mail: [email protected]

The European Physical Journal H

The idea of replacing field theory was taken up by a number of theoreticians who did not like the concept of renormalization, for it involves the notion of unobservable, hence unphysical, "bare" particles, and the renormalization procedure used mathematically ill-defined quantities. From the point of view of positivism, the notion of a "relativistic field", which is not directly observable, also concerned them. Geoffrey F. Chew fought most vigorously for a new approach. He wanted to base particle theory on observable quantities only. S-matrix (scattering-matrix) elements were to replace quantum fields, with the postulate [Chew 1961]: "The S matrix is a Lorentz-invariant analytic function of all momentum variables with only those singularities required by unitarity".

Quantum Field Theory was originally designed to describe electromagnetic interactions, i.e. the interaction of photons with electrons and muons. The strongly interacting particles, the hadrons, formed a much larger sample. At the time, eight fermions (p, n, Λ, Σ⁺, Σ⁰, Σ⁻, Ξ⁰, Ξ⁻) plus their antiparticles and seven bosons (π⁺, π⁰, π⁻, K₁⁰, K₂⁰, K⁺, K⁻) were known. The question arose whether some of them were truly elementary, with the others just bound states of the former. Chew had in mind an even more radical change in attitude.
He wanted to develop a theory in which the difference between elementary and composite particles would disappear: "It is difficult, however, to imagine a calculation sufficiently complete to approach a definite answer to the question: which of the strongly interacting particles are elementary? Partly because of this circumstance, but even more so because of general philosophical conviction, I am convinced that there can be only one sensible answer, and that is that none of them is elementary. ... In particular there is a remark often made privately by Feynman that tends to convert the negative statement into a positive one. Paraphrasing Feynman: the correct theory should be such that it does not allow one to say which particles are elementary. Such a concept is manifestly at odds with the spirit of conventional field theory, but it forms a smooth alliance with the S-matrix approach".

Chew's approach soon became known as the "bootstrap mechanism" and – for some time – was taken up by quite a number of theorists. Some went so far as to demand that quantum field theory be removed from university curricula. B. Schroer describes this struggle in the following words [Schroer 2010]: "This led to a confrontation of the S-matrix bootstrap with Quantum Field Theory at the end of the 60s. It was a struggle about a pure S-matrix approach cleansed of all field theoretic aspects; ... The ideological fervor found its strongest expression in conference reports where the S-matrix bootstrap proponents felt more free to celebrate what they perceived as their (premature) victory over Quantum Field Theory. ... The ferocity of the struggle on the side of the S-matrix purist against Quantum Field Theory is hard to understand in retrospect, ...".

Let us recall that Quantum Field Theory starts from so-called "bare" particles without interaction. (They carry "bare" masses and coupling constants.)
When the interaction is turned on, these quantities change into "physical" masses and coupling constants. The difference can – in principle – be calculated, but the relevant integrals do not converge, i.e. they turn out to be infinite (mathematically speaking, they do not exist!). In so-called "renormalizable theories" (including quantum electrodynamics!) observable quantities do not depend on these infinite integrals. Thus, by a largely arbitrary "regularization", the integrals can be forced to converge, since they do not enter the observables. Although – as mentioned – the results were beautifully compatible with experiment (e.g. the magnetic moments of the electron and the muon), the procedure left some uneasiness even among its advocates: "Infinities are swept under the rug!" was a frequent complaint.

Let me quote extensively from Chew in order to shed some light on the situation [Chew 1961]: "So that there can be no misunderstanding ... let me say at once that I believe the conventional association of fields with strongly interacting particles to be empty. I do not have firm convictions about leptons or photons, but it seems to me that no aspect of strong interactions has been clarified by the field concept. ... I do not wish to assert ... that conventional field theory is necessarily wrong, but only that it is sterile with respect to strong interactions and that, like an old soldier, it is destined not to die but just to fade away. Having made this point so strongly, I hasten to express an unqualified appreciation of the historical role played by field theory up to the present. ... However, it is my impression now that ... future development of an understanding of strong interactions will be expedited if we eliminate from our thinking such field-theoretical notions as Lagrangians, "bare" masses, "bare" coupling constants, and even the notion of "elementary particles".
I believe, in other words, that in the future we should work entirely within the framework of the analytically continued S matrix". But already in the preface Chew concedes: "Readers should be aware that S-matrix theory is still incomplete ... . Progress is currently rapid, and a complete theory may well develop within a few years' time. (A paper by Stapp, currently in press, makes a major step in this direction.)"

The mathematically most rigorous results of field theory were the CPT theorem and the connection between spin and statistics. With very few assumptions, invariance under the combined operations of charge conjugation (C), parity (P) and time reversal (T) was proven, although none of the three alone is conserved. Likewise it was proven that particles with integer spin (bosons) follow Bose-Einstein statistics and particles with half-integer spin (fermions) obey Fermi-Dirac statistics. In the abstract of the paper mentioned above, Stapp writes [Stapp 1962]: "The CPT theorem and the normal connection between spin and statistics are shown to be consequences of postulates of the S-matrix approach to elementary particle physics. The postulates are much weaker than those of field theory. Neither local fields nor any reference to space-time points are used. Quantum commutation relations and properties of the vacuum play no role. Completeness of the asymptotic states and positive definiteness of the metric are not required, though certain weaker asymptotic conditions prevail. The proofs depend on unitarity, macroscopic relativistic invariance, and a very weak analyticity requirement on the mass-shell scattering functions. The proofs are in the framework of the new S-matrix approach to elementary particle physics, which is established on a formal basis".
Stapp concedes that the CPT theorem and the connection between spin and statistics are "the two most important general physical consequences of relativistic field theory", but he objects to field theory because "it is not known whether the postulates are sufficiently realistic to include any theories except trivial ones in which the scattering matrix is unity". Moreover, from a philosophical point of view he insists that "experience does not entail the existence of space-time points", which are at the basis of relativistic quantum field theory. He insists: "Because space-time points are experimentally inaccessible both in practice and in principle, their introduction runs counter to the philosophy of quantum mechanics".

[1] Consequently, the physics of leptons and photons, as well as of weak interactions, found itself in a marginal position at large conferences. This led to the creation of new types of conferences. 1962 saw the first of a now well-established series, the "Conference on electron-photon interactions in the BeV energy range" in Boston, now called the "International Lepton-Photon Symposium". Marx created the series of conferences on neutrino physics and astrophysics, the first of which took place at Lake Balaton in Hungary in 1972; the next year saw the first "Workshop on Weak Interactions with Very High Energy Beams", created by Jan Nilsson and the author in Skövde, Sweden, now called the Workshop on Weak Interactions and Neutrinos (WIN).
[2] A comprehensive description can be found in Streater and Wightman: PCT, Spin and Statistics, and All That. Reprinted by Addison-Wesley, New York (1989).

On the other hand, according to Stapp, the "S-matrix approach to elementary particle physics is ... approaching the status of an independent theory, its connections to field theory gradually being dissolved. ... History encourages the casting away of formal substructures whose ingredients have no counterparts in experience ... .
The new approach, since it involves only observable quantities and their analytic continuations, has a claim to probable physical relevance much greater than that of field theory, with its sundry hypothetical ingredients of dubious status". Concluding, Stapp writes: "It has been shown how the appeal to field theoretic concepts can be completely avoided and the new S-matrix formalism built up from simple principles that are relatively secure".

Unsurprisingly, this far-reaching claim met fierce opposition from adherents of field theory. One of the most outspoken opponents was Res Jost [Jost 1963]. He wrote: "The main vice of Mr. Stapp is the fact that he does not properly reflect upon the problems", and: "It seems that Mr. Stapp, who does not shy away from indiscriminately criticising other physicists, is in his own case missing that self-criticism and precision of formulation which one is used to requiring from a physics paper". This fierce controversy on the philosophical side was paralleled on the experimental side by the unexpected discovery of a multitude of strongly interacting particles (hadrons).

2 Truly elementary particles or "particle democracy"?

In November 1959, the Proton Synchrotron at CERN began to operate at an energy of 28 GeV. A few months later, a similar machine – the Alternating Gradient Synchrotron at Brookhaven National Laboratory – reached an energy of 33 GeV. Soon after their inauguration, the new machines produced an unexpectedly large number of short-lived hadrons, then called "resonances". Although their lifetimes of about 10⁻²² s were very short, there was little difference in principle from some of the "stable" particles, e.g. the neutral pion with a lifetime of 10⁻¹⁶ s. This avalanche of new "elementary" particles further sharpened the contradiction between the idea of the "bootstrap" and field theory based on truly elementary entities.
In the midst of this controversy, a totally new idea was put forward by Gell-Mann [1964] and Zweig [1964]: neither should there be no elementary particles at all, nor should some members of the "zoo" be more elementary than the others; rather, a totally new class of particles – unobserved at that time – should be the constituents of all the known "elementary particles". These truly fundamental entities were called "quarks" by Gell-Mann (with reference to Finnegans Wake by James Joyce) and – inspired by card games – "aces" by Zweig. Gell-Mann left it open whether they were actual physical particles or "mathematical entities": "It is fun to speculate about the way quarks would behave if they were physical particles of finite mass, instead of purely mathematical entities as they would be in the limit of infinite mass".

[3] German original: Das Hauptübel besteht bei Herrn Stapp eben darin, dass er sich die Probleme nicht eigentlich überlegt hat ...
[4] German original: Uns scheint, dass Herr Stapp, der es anderen Physikern gegenüber an wahlloser Kritik nicht fehlen lässt, in eigener Sache auch diejenige Selbstkritik und Schärfe der Formulierung vermissen lässt, die man von einer physikalischen Arbeit zu verlangen gewohnt ist ...
[5] First individual lists of the new states were published in 1963 [Balachandran 1963a; Roos 1963]. From 1964 a group of physicists started to publish annual reviews of particle properties [Rosenfeld 1964]; since 1969 it has been officially cited as the "Particle Data Group".

This idea was so far outside the conventional thinking of the time that in the beginning most physicists simply did not take it seriously. After all, the charges of these "fundamental particles" were predicted to be 2e/3 and −e/3 (e = fundamental charge). This seemed to contradict the classic experiments of Millikan from 1911.
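The arithmetic behind the apparent conflict with Millikan is simple: charges of 2e/3 and −e/3 (with opposite signs for antiquarks) always combine to integer multiples of e in the observed hadrons, so fractional charges could hide inside integrally charged bound states. A minimal sketch of this bookkeeping (the quark assignments uud, udd and u–anti-d are the standard ones; the helper names are ours):

```python
from fractions import Fraction

# Quark charges in units of the fundamental charge e, as predicted
# by Gell-Mann and Zweig; antiquarks carry the opposite charge.
CHARGE = {"u": Fraction(2, 3), "d": Fraction(-1, 3)}

def charge(flavor):
    """Charge of a quark or antiquark ('anti-u', 'anti-d') in units of e."""
    if flavor.startswith("anti-"):
        return -CHARGE[flavor[len("anti-"):]]
    return CHARGE[flavor]

def hadron_charge(constituents):
    """Total charge of a bound state, in units of e."""
    return sum(charge(f) for f in constituents)

print(hadron_charge(["u", "u", "d"]))   # proton (uud)   -> 1
print(hadron_charge(["u", "d", "d"]))   # neutron (udd)  -> 0
print(hadron_charge(["u", "anti-d"]))   # pi+ (u, dbar)  -> 1
```

Only a free quark would expose a fractional charge directly, which is why the searches described below looked for isolated tracks and droplets.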
Consequently, Zweig could not publish his paper, and Gell-Mann sent his to the relatively new "Physics Letters"; before elaborating on his alternative idea, Gell-Mann concedes: "A highly promising approach is the purely dynamical 'bootstrap' model for all the strongly interacting particles within which one may try to derive ... broken eightfold symmetry from self-consistency alone".

The astonishment of the majority of physicists is best disclosed by Lipkin, who sent a paper to Physics Letters which was originally intended as a joke [Lipkin 1964]. (The original title was "The barbaryon classification for elementary particles SU(3)".) He writes: "Since possibilities are now being considered which seem just as fantastic as the barbaryon classification, perhaps the latter is not so crazy after all and deserves more serious consideration".

The astonishment as well as the excitement of the physics community is best recalled by the issue of the CERN Courier from March 1964: "From the time of Millikan's classic experiments in 1911 it has been accepted that the charge of the electron is the smallest one possible. The idea of fractionally charged particles seemed quite preposterous. Even those who suggested it seemed to share the doubts; ... For experimentalists, the excitement lay in the prediction that at least one of the new particles would be stable. ... The Electronics Experiments Committee, on 11 February, decided that the particles should be taken seriously. ... In any case, it quickly became clear that the combination of a bubble chamber and the o2 beam in the PS East hall provided the quickest way of looking for them. ... While working on this proposal, D R O Morrison realized that the same kind of bubble-chamber exposure had in fact been carried out with the CERN 32 cm chamber in 1960. The photographs were got out and a team of physicists and scanners looked through 10 000 of them in one night. No aces were found.
The group working with the Ecole Polytechnique heavy-liquid bubble chamber scanned 100 000 photographs. Again the result was negative. ... Zweig's aces and Gell-Mann's quarks may or may not be found, but their ideas have triggered off a new series of moves in this search for an explanation of the occurrence of the so-called fundamental particles".

Particles with fractional charge were indeed found – in cosmic-ray events! McCusker and Cairns write [McCusker 1969; Kasha 1966]: "In one year from July 1968 we found four tracks whose appearance was that expected for a quark of charge 2e/3". But the result could not be reproduced. About ten years later, free quarks turned up again: a Millikan-type experiment showed fractional charges on niobium balls [LaRue 1977]. It was not a sufficient objection that no such effects had been found in similar experiments with steel balls [Gallinaro 1966; Marinelli 1982] – after all, quarks could have a preference for certain elements. In order to rule out this possibility, the experiment had to be repeated with niobium [Smith 1986]. This is a fine example of the author's claim [Pietschmann 1996; 2007]: "Everything that is predicted by a sufficiently renowned theorist will be discovered, irrespective of its actual existence; that is why in physics the criterion for existence is not a discovery, but only the proof of reproducibility!"

These discussions brought back to mind that even at the time of Millikan there had been a great controversy [e.g. Niaz 2000]: Felix Ehrenhaft at the University of Vienna, who had independently invented the method used by Millikan, kept producing fractional charges. Thus on June 11, 1980 Zweig wrote the following letter to the author: "Two students of mine are repeating the Millikan oil drop experiment using a number of materials, including selenium, instead of oil.
In Johanna Fürst's 1920 dissertation from the Third Physics Institute of the University of Vienna, the measured charges on 150 selenium spheres are given. These results, which are striking to the modern eye, are published by Felix Ehrenhaft in Physik. Zeitschr. 39, 673 (1938). ... Please note the two large peaks at charge 1 and 2/3. There are so many possible explanations for the peak at 2/3. For example, if the selenium was in the form of two spheres sticking together and had a net charge of 1, then the apparent charge coming out of Fürst's analysis would have been less than one. Nevertheless, the possibility, however remote, that free quarks were present in the selenium is still there. Consequently we would like to have a sample of the selenium used by Fürst in 1920 (and perhaps by Ehrenhaft at a later time). As a favour, would you please find out if this selenium still exists? If so, please send a sample. If not, is it possible to find out who manufactured this selenium? If all else fails, would it be possible to get some selenium that was sold in Vienna before 1920? Perhaps there is an old bottle stored away in one of the chemical stockrooms. Thank you for your assistance. Sincerely yours, George Zweig."

Unfortunately, I could not be of any help. Johanna Reif-Fürst had passed away, and among the many materials from that time selenium was neither left over nor was its source traceable. Two former assistants of Ehrenhaft had become professors of physics, but their search did not produce any result either.

Eventually, quarks were discovered in a reproducible way – however, neither as free particles nor as mathematical entities, but confined within the nucleons. The Nobel Prize in Physics was awarded in 1990 to Friedman, Kendall and Taylor "for their pioneering investigations concerning deep inelastic scattering of electrons on protons and bound neutrons, which have been of essential importance for the development of the quark model in particle physics".
Quantum Chromodynamics now provides the theory capable of predicting the phenomenon of confinement. In a certain sense, history seems to repeat itself periodically on ever deeper levels of understanding: in its October 2010 issue (p. 6), the CERN Courier writes: "The ATLAS experiment at the LHC has set the world's best known limits for the mass of a hypothetical excited quark, q*... . The existence of such a state would indicate that a quark is a composite particle as opposed to an elementary one as the Standard Model assumes."

3 An observable renormalization

Let us turn back to the controversy between the S-matrix and field-theory approaches. If it were possible to construct a model which can be described by unitarity and dispersion relations alone, but does not follow from a Lagrangian (or Hamiltonian), it would mean victory for the S-matrix approach. In 1961, F. Zachariasen claimed exactly that breakthrough [Zachariasen 1961]. He writes: "We shall construct a model field theory which is perhaps unusual in that it is not defined in terms of a Lagrangian or Hamiltonian; ... Instead we shall assume the existence of a complete set of dispersion relations, which, together with unitarity, form a set of coupled integral equations for the transition amplitudes of the theory". And, more explicitly: "Conventionally, field theories are defined by specifying a Lagrangian density, and from this obtaining field equations, perturbation expansions, and so on. It has been suggested, however, that field theories can equally well be defined by writing down a set of dispersion relations which, when combined with unitarity, provide an infinite set of coupled integral equations from which all the transition amplitudes of the theory may be determined. This second approach has the virtue of not involving any unobservable quantities, such as bare masses or coupling constants, at any stage of the development".
Within the same year, W. Thirring published a paper called "Lagrangian Formulation of the Zachariasen Model" [Thirring 1962]. He writes: "Our results have shown that there exists a Lagrangian for the Zachariasen model which has the same shortcomings as the ones of other relativistic field theories: it contains renormalization constants when expressed in terms of bare fields, but everything is perfectly finite when the incoming or outgoing fields are introduced. ... Our findings make one suspect that field theories without Hamiltonian, which are defined only by dispersion relations, actually have an underlying Hamiltonian structure".

We have seen that the defenders of the S-matrix approach referred to Gell-Mann's 1956 remark at the Rochester Conference in New York. But in his fundamental paper of 1962 he cautiously turned towards field theory [Gell-Mann 1962]. In this paper, Gell-Mann considers the algebraic structure of the electric current jα and the weak current Jα. He notes that matrix elements of these currents are well defined and obey dispersion relations; however, "homogeneous linear dispersion relations, even without subtractions, do not suffice to fix the scale of these matrix elements; in particular, for the nonconserved currents, the renormalization factors cannot be calculated, ...". In other words, the important ratio of the weak axial to the weak vector nucleon current, GA/GV, cannot be calculated from linear relations. Since it is an ingredient in many fundamental relations (such as the Goldberger-Treiman relation), its theoretical computation was considered an important task. Thus Gell-Mann continues: "More information than just the dispersion relations must be supplied, for example, by field-theoretic models; we consider, in fact, the equal-time commutation relations of the various parts of j4 and J4. These nonlinear relations define an algebraic system (or a group) that underlies the structure of baryons and mesons".
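Gell-Mann's point that linear, homogeneous equations cannot fix an overall scale has a simple finite-dimensional analogue: if f solves f = Kf for a linear kernel K, then so does every multiple λf. A toy sketch (the vector v and the projector K are hypothetical stand-ins for a solution and a dispersion-relation kernel, not objects from the paper):

```python
# Toy analogue of a linear, homogeneous equation f = K f: here K
# projects onto a fixed vector v, so (K f) = v * (v.f) / (v.v).
v = [1.0, 2.0, -1.0]

def apply_K(f):
    """Apply the homogeneous linear kernel K to f."""
    scale = sum(a * b for a, b in zip(v, f)) / sum(a * a for a in v)
    return [a * scale for a in v]

# f = v solves the equation -- and so does every rescaling lambda * f,
# so the equation alone cannot fix the normalization (the analogue of
# the undetermined constant GA).
for lam in (1.0, 3.0, -0.5):
    f = [lam * a for a in v]
    assert all(abs(x - y) < 1e-12 for x, y in zip(apply_K(f), f))
print("shape fixed, scale free")
```

This is why Gell-Mann turned to the *nonlinear* equal-time commutation relations: a nonlinear constraint relates different powers of the amplitude and thereby pins down its normalization.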
An internal symmetry is defined by the algebra of its generators,

    [Iα, Iβ] = cαβγ Iγ.   (1)

(In the case of isospin, the cαβγ are the components of the ε-tensor times i.) The generators, in turn, are given by the integrals over the time-components of the currents,

    Iα = ∫ d³x J0,α(x).   (2)

From these equations one obtains the equal-time commutation relations of the currents (x and y denote spatial arguments at equal time),

    [J0,α(x), J0,β(y)] = cαβγ J0,γ(x) δ(x − y).   (3)

Since these are nonlinear equations, they allow for a computation of GA, which is the matrix element of the axial-vector current between proton and neutron states at zero momentum transfer. In the text, Gell-Mann points this out explicitly: "The dispersion relations for the matrix elements of weak or electromagnetic currents are linear and homogeneous. ... Now such linear and homogeneous equations ... cannot fix the scale of these matrix elements; constants like −GA/G cannot be calculated without further information. A field theory of the strong interactions, with explicit expressions for the currents, somehow contains more than these dispersion relations". But in the final paragraph he insists: "Nowhere does our work conflict with the program of Chew et al. of dynamical calculation of the S matrix for strong interactions, using dispersion relations. ... If there are no fundamental fields ..., all baryons and mesons being bound or resonant states of one another, ... the symmetry properties that we have abstracted can still be correct". In a sense, the axial-vector constant GA is an observable renormalization constant, and as such it weakens the objection of the S-matrix defenders against field theory on account of its unobservable renormalization.

[6] Zachariasen refers to the talk of Gell-Mann at the Rochester Conference in New York 1956; see Chap. 1.
[7] The time-components of the currents, now usually denoted by j0 and J0.
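The commutation relations (1) can be checked in their simplest realization. The sketch below assumes the two-dimensional isospin representation Iα = σα/2 with structure constants cαβγ = iεαβγ (an illustrative choice on the nucleon doublet, not the full current algebra of the paper):

```python
import numpy as np

# Pauli matrices; the isospin generators are I_a = sigma_a / 2.
sigma = [
    np.array([[0, 1], [1, 0]], dtype=complex),     # sigma_x
    np.array([[0, -1j], [1j, 0]], dtype=complex),  # sigma_y
    np.array([[1, 0], [0, -1]], dtype=complex),    # sigma_z
]
I = [s / 2 for s in sigma]

# Totally antisymmetric epsilon tensor, eps[0,1,2] = +1.
eps = np.zeros((3, 3, 3))
for a, b, c in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[a, b, c] = 1.0
    eps[b, a, c] = -1.0

# Verify Eq. (1): [I_a, I_b] = c_abc I_c with c_abc = i * eps_abc.
for a in range(3):
    for b in range(3):
        comm = I[a] @ I[b] - I[b] @ I[a]
        rhs = sum(1j * eps[a, b, c] * I[c] for c in range(3))
        assert np.allclose(comm, rhs)
print("isospin commutation relations verified")
```

The nonlinearity Gell-Mann exploits is visible here: the right-hand side of (1) contains the generators themselves, so the relations fix the normalization of the generators and, through (2) and (3), the scale of the current matrix elements.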
Gell-Mann suggests that "the equal-time commutation relations for currents and densities lead to exact sum rules for the weak and electromagnetic matrix elements". This suggestion eventually led to the technique of current algebra and yielded the long-awaited computation of GA/GV (in Gell-Mann's notation, −GA/G).

Published: 2011